Variational Sequential Monte Carlo
Authors
Abstract
Many recent advances in large scale probabilistic inference rely on variational methods. The success of variational approaches depends on (i) formulating a flexible parametric family of distributions, and (ii) optimizing the parameters to find the member of this family that most closely approximates the exact posterior. In this paper we present a new approximating family of distributions, the variational sequential Monte Carlo (VSMC) family, and show how to optimize it in variational inference. VSMC melds variational inference (VI) and sequential Monte Carlo (SMC), providing practitioners with flexible, accurate, and powerful Bayesian inference. The VSMC family is a variational family that can approximate the posterior arbitrarily well, while still allowing for efficient optimization of its parameters. We demonstrate its utility on state space models, stochastic volatility models for financial data, and deep Markov models of brain neural circuits.
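To make the construction concrete, the sketch below illustrates the core idea described in the abstract, not the authors' implementation: run SMC with a proposal whose parameters act as variational parameters, and use the log of the unbiased marginal-likelihood estimate, log Ẑ, as a surrogate ELBO to be maximized. The 1-D linear-Gaussian state space model, the (shift, log_sigma) proposal parameterization, and all function names here are illustrative assumptions; gradient estimation of the bound (e.g. by reparameterization) is omitted.

```python
import numpy as np

def log_normal_pdf(x, mean, var):
    # log density of N(mean, var), elementwise
    return -0.5 * (np.log(2 * np.pi * var) + (x - mean) ** 2 / var)

def vsmc_log_zhat(y, lam, a=0.9, q=1.0, r=0.5, n_particles=100, rng=None):
    """One SMC pass; returns log Z-hat, whose expectation is the VSMC-style lower bound."""
    rng = np.random.default_rng() if rng is None else rng
    shift, log_sigma = lam            # assumed proposal parameterization
    sigma2 = np.exp(2 * log_sigma)
    x = np.zeros(n_particles)         # particles; x_0 = 0 for simplicity
    log_zhat = 0.0
    for t in range(len(y)):
        # propose from q_lambda(x_t | x_{t-1}) = N(a * x_{t-1} + shift, sigma2)
        mean_q = a * x + shift
        x_new = mean_q + np.sqrt(sigma2) * rng.standard_normal(n_particles)
        # importance weights: transition * likelihood / proposal
        log_w = (log_normal_pdf(x_new, a * x, q)
                 + log_normal_pdf(y[t], x_new, r)
                 - log_normal_pdf(x_new, mean_q, sigma2))
        m = log_w.max()
        log_zhat += m + np.log(np.mean(np.exp(log_w - m)))  # logsumexp(log_w) - log N
        # multinomial resampling
        w = np.exp(log_w - m)
        w /= w.sum()
        x = x_new[rng.choice(n_particles, size=n_particles, p=w)]
    return log_zhat

# Monte Carlo estimate of the surrogate objective E[log Z-hat] on toy data
rng = np.random.default_rng(0)
y = rng.standard_normal(25)
lam = np.array([0.0, 0.0])            # (shift, log_sigma)
elbo_est = np.mean([vsmc_log_zhat(y, lam, rng=rng) for _ in range(10)])
print("estimated VSMC-style lower bound:", elbo_est)
```

In practice, the proposal parameters lam would be updated by stochastic gradient ascent on this bound; the sketch only evaluates it.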
Similar resources
Transdimensional sequential Monte Carlo for hidden Markov models using variational Bayes - SMCVB
In this paper we outline a transdimensional sequential Monte Carlo algorithm, SMCVB, for fitting hidden Markov models. Sequential Monte Carlo (SMC) involves generating a weighted sample of particles from a sequence of probability distributions, with the aim of converging to the target Bayesian posterior distribution. SMCVB makes use of variational Bayes (VB) in combination with SMC principles to c...
Notes on data assimilation for nonlinear high-dimensional dynamics: stochastic approach
This manuscript is devoted to attempts at designing new nonlinear data assimilation schemes. The variational and sequential assimilation methods are reviewed, with emphasis on their performance in dealing with the nonlinearity and high dimensionality of environmental dynamical systems. Nonlinear data assimilation is based on a Bayesian formulation and its approximate solutions. Sequential ...
Conditional mean field
Despite all the attention paid to variational methods based on sum-product message passing (loopy belief propagation, tree-reweighted sum-product), these methods are still bound to inference on a small set of probabilistic models. Mean field approximations have been applied to a broader set of problems, but the solutions are often poor. We propose a new class of conditionally-specified variatio...
Variational Gaussian Process State-Space Models
State-space models have been successfully used for more than fifty years in different areas of science and engineering. We present a procedure for efficient variational Bayesian learning of nonlinear state-space models based on sparse Gaussian processes. The result of learning is a tractable posterior over nonlinear dynamical systems. In comparison to conventional parametric models, we offer th...
Inference and Parameter Estimation in Gamma Chains
We investigate a class of prior models, called Gamma chains, for modelling dependencies in time-frequency representations of signals. We assume transform coefficients are drawn independently from Gaussians where the latent variances are coupled using Markov chains of inverse Gamma random variables. Exact inference is not feasible, but this model class is conditionally conjugate, so st...
Publication year: 2018